AIbase
# Small parameter optimization

## MicroLlama
License: Apache-2.0
MicroLlama is a 300-million-parameter Llama model pretrained by individual developer keeeeenw on a $500 budget, focused on English text-generation tasks.
Tags: Large Language Model, Transformers, English
Author: keeeeenw · Downloads: 2,955 · Likes: 46

## GPT-2 Small CS
A small version of GPT-2 pretrained on 115 GB of cleaned Czech text, suited to Czech text-generation tasks.
Tags: Large Language Model, Transformers, Other
Author: fav-kky · Downloads: 135 · Likes: 2

## 2chan ruGPT3 Small
ruGPT3-small is a small Russian language model trained on a partial corpus of 2chan posts, suited to text-generation tasks.
Tags: Large Language Model
Author: TheBakerCat · Downloads: 20 · Likes: 0

## SRoBERTa-L
License: Apache-2.0
A RoBERTa language model trained on Croatian and Serbian text, using a 6 GB dataset for 500,000 steps.
Tags: Large Language Model, Transformers, Other
Author: Andrija · Downloads: 17 · Likes: 0

## Legal T5 Small Trans Es En Small Finetuned
A model for translating legal texts from Spanish to English, fine-tuned on the T5-small architecture.
Tags: Machine Translation
Author: SEBIS · Downloads: 44 · Likes: 0

## Legal T5 Small Trans Cs Sv
A small T5 model for translating legal texts from Czech to Swedish.
Tags: Machine Translation
Author: SEBIS · Downloads: 17 · Likes: 0
© 2025 AIbase